
    FaceQnet: Quality Assessment for Face Recognition based on Deep Learning

    In this paper we develop a quality assessment approach for face recognition based on deep learning. The method consists of a Convolutional Neural Network, FaceQnet, that predicts the suitability of a specific input image for face recognition purposes. FaceQnet is trained on the VGGFace2 database. We employ the BioLab-ICAO framework to label the VGGFace2 images with quality information related to their ICAO compliance level. The groundtruth quality labels are obtained using FaceNet to generate comparison scores. We use the groundtruth data to fine-tune a ResNet-based CNN so that it returns a numerical quality measure for each input image. Finally, we verify whether the FaceQnet scores predict the expected performance when a specific image is used for face recognition with a COTS face recognition system. Several conclusions can be drawn from this work, most notably: 1) we employed an existing ICAO compliance framework and a pretrained CNN to automatically label data with quality information; 2) we trained FaceQnet for quality estimation by fine-tuning a pre-trained face recognition network (ResNet-50); and 3) we showed that the predictions from FaceQnet are highly correlated with the face recognition accuracy of a state-of-the-art commercial system not used during development. FaceQnet is publicly available on GitHub. Comment: preprint version of a paper accepted at ICB 201
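The labeling idea described above (groundtruth quality derived from embedding comparison scores) can be illustrated with a minimal sketch. The function names, the distance threshold `d_max`, and the linear mapping to [0, 1] are illustrative assumptions, not the paper's actual procedure:

```python
import math

def comparison_score(emb_a, emb_b):
    """Euclidean distance between two face embeddings (lower = more similar)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(emb_a, emb_b)))

def quality_label(probe_emb, mated_embs, d_max=2.0):
    """Map the mean mated comparison distance to a quality label in [0, 1].

    A probe whose embedding sits close to its high-quality mated references
    gets a label near 1; a hard-to-match probe gets a label near 0.
    """
    mean_d = sum(comparison_score(probe_emb, e) for e in mated_embs) / len(mated_embs)
    return max(0.0, 1.0 - mean_d / d_max)
```

A regression CNN can then be fine-tuned against such labels, which is the general shape of the pipeline the abstract describes.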

    Likelihood ratio data to report the validation of a forensic fingerprint evaluation method

    The data to which the authors refer throughout this article are likelihood ratios (LRs) computed from the comparison of 5–12-minutiae fingermarks with fingerprints. These LR data are used for the validation of a likelihood ratio (LR) method in forensic evidence evaluation. They are a necessary asset for conducting validation experiments when validating LR methods used in forensic evidence evaluation and for setting up validation reports. These data can also be used as a baseline for comparing fingermark evidence in the same minutiae configurations as presented in (D. Meuwly, D. Ramos, R. Haraksim) [1], although the reader should keep in mind that different feature extraction algorithms and different AFIS systems may produce different LR values. Moreover, these data may serve as a reproducibility exercise, in order to practise the generation of validation reports of forensic methods according to [1]. Alongside the data, a justification and motivation for the methods used is given. These methods calculate LRs from the fingerprint/mark data and are subject to a validation procedure. The choice of using real forensic fingerprints in the validation and simulated data in the development is described and justified. Validation criteria are set for the purpose of validating the LR methods, which are used to calculate the LR values from the data, and for the validation report. For privacy and data protection reasons, the original fingerprint/mark images cannot be shared. These images, however, do not constitute the core data for the validation, contrary to the LRs, which are shared. The research was conducted in the scope of the BBfor2 – European Commission Marie Curie Initial Training Network (FP7-PEOPLE-ITN 2008 under Grant Agreement 238803) in cooperation with the Netherlands Forensic Institute and the ATVS Biometric Recognition Group at the Universidad Autonoma de Madrid
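The LRs in such data come from a score-based method: a comparison score is evaluated under a same-source and a different-source model. As a minimal sketch (the Gaussian score models and their parameters are illustrative assumptions, not the AFIS-based method used for the published data):

```python
from statistics import NormalDist

def likelihood_ratio(score, same_model, diff_model):
    """LR = p(score | same source) / p(score | different sources)."""
    return same_model.pdf(score) / diff_model.pdf(score)

# Illustrative Gaussian models fitted to mated / non-mated comparison scores.
same = NormalDist(0.8, 0.1)   # scores from same-source comparisons
diff = NormalDist(0.2, 0.1)   # scores from different-source comparisons

lr = likelihood_ratio(0.7, same, diff)  # well above 1: supports same source
```

An LR above 1 supports the same-source proposition; below 1, the different-source proposition.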

    Measuring coherence of computer-assisted likelihood ratio methods

    This is the author’s version of a work that was accepted for publication in Forensic Science International. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms, may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Forensic Science International, 249 (2015): 123–132, DOI: 10.1016/j.forsciint.2015.01.033. Measuring the performance of forensic evaluation methods that compute likelihood ratios (LRs) is relevant for both the development and the validation of such methods. A framework of performance characteristics, categorized as primary and secondary, is introduced in this study to support such development and validation. Ground-truth-labelled fingerprint data are used to assess the performance of an example likelihood ratio method in terms of those performance characteristics. Discrimination, calibration, and especially the coherence of this LR method are assessed as a function of the quantity and quality of the trace fingerprint data. Assessment of the coherence revealed a weakness of the comparison algorithm in the computer-assisted likelihood ratio method used. This research was conducted in the scope of the BBfor2 – European Commission Marie Curie Initial Training Network (FP7-PEOPLE-ITN-2008 under Grant Agreement 238803) at the Netherlands Forensic Institute, in collaboration with the ATVS Biometric Recognition Group at the Universidad Autonoma de Madrid and the National Police Services Agency of the Netherlands
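Discrimination and calibration of a set of LRs are commonly summarized in this literature with the log-likelihood-ratio cost, Cllr. A minimal sketch of that metric, assuming only two lists of ground-truth-labelled LR values:

```python
import math

def cllr(same_source_lrs, diff_source_lrs):
    """Log-likelihood-ratio cost.

    0 for a perfect method; 1 for an uninformative one (all LRs equal to 1);
    values above 1 indicate miscalibration.
    """
    pen_ss = sum(math.log2(1 + 1 / lr) for lr in same_source_lrs)
    pen_ds = sum(math.log2(1 + lr) for lr in diff_source_lrs)
    return 0.5 * (pen_ss / len(same_source_lrs) + pen_ds / len(diff_source_lrs))
```

Large LRs for same-source comparisons and small LRs for different-source comparisons both drive the penalties, and hence Cllr, towards zero.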

    Automatic fingerprint recognition: from children to elderly Ageing and age effects

    By courtesy of the Portuguese Government, DG JRC has received a comprehensive set of fingerprint data from individuals aged 0–25 and 65–98. The main purpose of the proposed experiments is to deepen the understanding of the physiological development of the fingertip ridge structure over time and its impact on automated fingerprint recognition. The experiments explore three biometric processes in the light of age, ageing and growth effects. These effects are demonstrated and validated. A growth model is also developed and validated. The report concludes with a series of recommendations for enhanced implementation of automated fingerprint recognition systems and suggestions for further research. JRC.E.3-Cyber and Digital Citizens' Security

    Validation of likelihood ratio methods used for forensic evidence evaluation: application in forensic fingerprints

    In this chapter the Likelihood Ratio (LR) inference model is introduced, the theoretical aspects of probabilities are discussed, and the validation framework for LR methods used for forensic evidence evaluation is presented. Prior to introducing the validation framework, the following questions are addressed: “which aspects of a forensic evaluation scenario need to be validated?”, “what is the role of the LR as part of a decision process?” and “how to deal with uncertainty in the LR calculation?”. The answers to these questions are necessary to define the validation strategy based on the validation criteria. The questions “what to validate?”, focusing on defining validation criteria and methods, and “how to validate?”, dealing with the implementation of a validation protocol, form the core of this chapter. The validation framework described can assist forensic practitioners in determining the suitability and applicability of an LR method developed for forensic practice, by introducing performance characteristics, performance metrics, validation criteria and decision thresholds. The chapter starts with the introduction of the LR inference model, followed by the proposed validation framework

    Study on Fingermark and Palmmark Identification Technologies for its Implementation in the Schengen Information System

    The report assesses the technology readiness and availability of new functionalities – based on automatic fingermark and palmmark recognition technologies – for their integration into the Schengen Information System (SIS). These functionalities were introduced in the revised SIS Regulations adopted on 28 November 2018, both in the context of police and judicial cooperation. The report is structured in two parts. Part I introduces the automatic fingermark and palmmark recognition functionalities, reviews the latest developments and the state of the art, presents quality metrics and the relevant biometric standards, and concludes with a summary section entitled “lessons learnt”. Part II places the functionalities in the context of the Schengen Information System: use cases for border control and for police and judicial cooperation in criminal matters are presented, and a list of recommendations for the successful implementation of fingermark and palmmark processing technologies into the Schengen Information System is provided. JRC.E.3-Cyber and Digital Citizens' Security

    A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation

    This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayes’ inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of the evidence that a trace specimen, e.g. a fingermark, and a reference specimen, e.g. a fingerprint, originate from common or different sources. Some theoretical aspects of probabilities necessary for this Guideline were discussed prior to its elaboration, which started after a workshop of forensic researchers and practitioners involved in this topic. In the workshop, the following questions were addressed: “which aspects of a forensic evaluation scenario need to be validated?”, “what is the role of the LR as part of a decision process?” and “how to deal with uncertainty in the LR calculation?”. The question “what to validate?” focuses on the validation methods and criteria, and “how to validate?” deals with the implementation of the validation protocol. Answers to these questions were deemed necessary for several objectives. First, concepts typical of validation standards [1], such as performance characteristics, performance metrics and validation criteria, are adapted or applied by analogy to the LR framework. Second, a validation strategy is defined. Third, validation methods are described. Finally, a validation protocol and an example of a validation report are proposed, which can be applied in forensic fields developing and validating LR methods for the evaluation of the strength of evidence at source level under the following propositions: H1/Hss: the trace and reference originate from the same source; H2/Hds: the trace and reference originate from different sources
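The core of such a protocol is a validation decision: measured performance metrics are compared against pre-set validation criteria. A minimal sketch of that decision step, in which the metric names and thresholds are purely illustrative (the Guideline itself defines the actual characteristics and criteria):

```python
def validate(metrics, criteria):
    """A method passes validation when every measured performance metric
    meets its pre-set criterion (here: 'lower is better' thresholds)."""
    return all(metrics[name] <= threshold for name, threshold in criteria.items())

# Illustrative thresholds agreed before the validation experiments.
criteria = {"cllr": 0.5, "eer": 0.05}
# Metrics measured on ground-truth-labelled validation data.
metrics = {"cllr": 0.31, "eer": 0.02}

validate(metrics, criteria)  # True: every metric meets its criterion
```

Fixing the criteria before running the experiments is what makes the outcome a validation decision rather than a post-hoc performance report.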

    A Study of Age and Ageing in Fingerprint Biometrics

    Thanks to Mr James Bond we are aware that diamonds are forever, but are fingerprints? It is well known that biometrics brings a new paradigm to the security field: unlike in traditional systems, individuals are not identified by something that they have or know, but by what they are. While such an approach entails some clear advantages, an important question remains: is what we are today the same as what we will be tomorrow? The present paper addresses this key problem in the fingerprint modality, based on a database of over 400K impressions coming from more than 250K different fingers. The database was acquired under real operational conditions and contains fingerprints from subjects aged between 0 and 98 years. Fingerprint pairs were collected with a time difference ranging between 0 and 7 years. Such a unique set of data has allowed us to analyse both age and ageing effects, shedding new light on issues such as fingerprint permanence and fingerprint quality. JRC.E.3-Cyber and Digital Citizens' Security

    Fingerprint Growth Model for Mitigating the Ageing Effect on Children’s Fingerprints Matching

    Nowadays, the majority of fingerprint quality, matching and feature extraction algorithms are developed and trained on fingerprints of adults. Accordingly, the processing of children’s fingerprints presents performance issues derived for the most part from: 1) their smaller size and finer ridge structure; and 2) their higher variability over time due to the displacement of minutiae induced by growth. The present article focuses on the second factor. The rapid growth of children’s fingerprints causes a significant displacement of the minutiae points between samples of the same finger acquired a few years apart. This displacement decreases the accuracy of fingerprint recognition systems when the reference and probe samples drift apart in time, an effect known as biometric ageing. In the present study, we propose to address this issue by developing and validating a minutiae-based growth model, derived from a database of over 60,000 children’s fingerprints acquired in real operational conditions from subjects between 5 and 16 years of age, with a time difference of up to 6 years between fingerprint re-enrolments. We analyse two potential application scenarios for the developed growth model. On the one hand, we use the model to grow children’s fingerprints, spreading out the minutiae points to attain sizes similar to those of a sample captured at a later point in time. On the other hand, we apply the model to rejuvenate fingerprints enrolled at a later stage by contracting the minutiae points so that their locations are more similar to those of a sample acquired earlier. In both scenarios, applying the growth model to produce artificially grown or rejuvenated minutiae templates results in a significant improvement of the matching scores compared to those produced by the original fingerprints. JRC.E.3-Cyber and Digital Citizens' Security
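The grow/rejuvenate operations described above can be sketched, in their simplest form, as an isotropic scaling of a minutiae template about its centroid. This is a toy illustration under that assumption; the article's actual growth model is fitted to real re-enrolment data and need not be a uniform scaling:

```python
def grow_template(minutiae, scale):
    """Scale minutiae (x, y, angle) about their centroid.

    scale > 1 mimics growth (spreading minutiae apart); scale < 1 mimics
    rejuvenation (contracting them). Angles are unchanged by an isotropic
    scaling, so they are passed through as-is.
    """
    cx = sum(m[0] for m in minutiae) / len(minutiae)
    cy = sum(m[1] for m in minutiae) / len(minutiae)
    return [(cx + scale * (x - cx), cy + scale * (y - cy), a)
            for x, y, a in minutiae]

# "Grow" a two-minutiae toy template by a factor of 2 about its centroid.
grown = grow_template([(0.0, 0.0, 10), (2.0, 2.0, 20)], 2.0)
```

Applying such a transformation to the reference template before matching is what lets the grown/rejuvenated minutiae align better with a sample taken years apart.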